On Reverse Pinsker Inequalities
Author
Abstract
New upper bounds on the relative entropy are derived as a function of the total variation distance. One bound refines an inequality by Verdú for general probability measures. A second bound improves the tightness of an inequality by Csiszár and Talata for arbitrary probability measures defined on a common finite set. The latter result is further extended, for probability measures on a finite set, to an upper bound on the Rényi divergence of an arbitrary non-negative order (including ∞) as a function of the total variation distance. A further lower bound by Verdú on the total variation distance, expressed in terms of the distribution of the relative information, is tightened, and it is shown to be attained under certain conditions. The impact of these improvements is illustrated with examples.
Similar resources
IRWIN AND JOAN JACOBS CENTER FOR COMMUNICATION AND INFORMATION TECHNOLOGIES On Reverse Pinsker Inequalities
Weighted Csiszár–Kullback–Pinsker Inequalities and Applications to Transportation Inequalities
Abstract. We strengthen the usual Csiszár-Kullback-Pinsker inequality by allowing weights in the total variation norm; admissible weights depend on the decay of the reference probability measure. We use this result to derive transportation inequalities involving Wasserstein distances for various exponents: in particular, we recover the equivalence between a T1 inequality and the existence of a ...
Trends to Equilibrium in Total Variation Distance
This paper presents different approaches, based on functional inequalities, to study the speed of convergence in total variation distance of ergodic diffusion processes with initial law satisfying a given integrability condition. To this end, we give a general upper bound “à la Pinsker” enabling us to study our problem firstly via usual functional inequalities (Poincaré inequality, weak Poincar...
Generalised Pinsker Inequalities
We generalise the classical Pinsker inequality which relates variational divergence to Kullback-Leibler divergence in two ways: we consider arbitrary f-divergences in place of KL divergence, and we assume knowledge of a sequence of values of generalised variational divergences. We then develop a best possible inequality for this doubly generalised situation. Specialising our result to the clas...
Some inequalities for information divergence and related measures of discrimination
Inequalities which connect information divergence with other measures of discrimination or distance between probability distributions are used in information theory and its applications to mathematical statistics, ergodic theory and other scientific fields. We suggest new inequalities of this type, often based on underlying identities. As a consequence we obtain certain improvements of the well...
Journal title:
- CoRR
Volume: abs/1503.07118, Issue: -
Pages: -
Publication year: 2015